The previous slide formalized Big-O notation as an asymptotic upper bound: $f(n)$ is $O(g(n))$ if $f(n) \le c \cdot g(n)$ for all sufficiently large $n$. To understand Big-O fully, we now apply this definition to prove the hierarchical relationship between common complexity classes.
- Working through these proofs shows why we can drop constants and focus solely on the dominating term $g(n)$; a small numerical sanity check of the definition appears after this list.
-
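To make the definition concrete, a finite numerical check can help. The sketch below is illustrative only (the helper name `bound_holds` and the example $f(n) = 5n + 3$ are assumptions, not from the slides): it samples the inequality $f(n) \le c \cdot g(n)$ for $n \ge n_0$. A finite sample can refute a candidate $c, n_0$, but never proves the asymptotic claim.

```python
def bound_holds(f, g, c, n0, n_max=10**5):
    """Check f(n) <= c * g(n) for every integer n0 <= n <= n_max.

    A finite sample can only refute an O(.) claim, never prove it,
    but it is a handy sanity check on a chosen c and n0.
    """
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Dropping constants: f(n) = 5n + 3 is O(n), witnessed by c = 8 and n0 = 1,
# since 5n + 3 <= 8n whenever n >= 1.
print(bound_holds(lambda n: 5 * n + 3, lambda n: n, c=8, n0=1))  # True
```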
Proof 1: Logarithmic is bounded by Linear
- We want to prove that $f(n) = 10000 \cdot \log_2(n)$ is $O(n)$. We must find constants $c > 0$ and $n_0 \ge 1$ such that: $$ 10000 \cdot \log_2(n) \le c \cdot n \text{ for all } n > n_0 $$
- Since $\log_2(n)$ grows much slower than $n$, we know such constants exist.
- If we choose $c = 10000$, the inequality becomes $\log_2(n) \le n$.
- This inequality holds true for all $n \ge 1$.
- We have formally proven that $10000 \cdot \log_2(n)$ is $O(n)$ by selecting $c = 10000$ and $n_0 = 1$; a quick numerical spot-check of these constants follows this list.
-
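As an illustration only (the proof above is what establishes the claim), the chosen constants can be spot-checked numerically; the sample values of $n$ below are an arbitrary choice.

```python
import math

# Spot-check the Proof 1 constants c = 10000 and n0 = 1:
# 10000 * log2(n) <= c * n should hold at every sampled n >= 1.
c = 10000
for n in [1, 2, 10, 1000, 10**6, 10**9]:
    lhs = 10000 * math.log2(n)
    rhs = c * n
    assert lhs <= rhs, f"bound fails at n = {n}"
    print(f"n = {n:>10}: 10000*log2(n) = {lhs:14,.1f} <= c*n = {rhs:,}")
```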
Proof 2: Linear is NOT bounded by Logarithmic (Proof by Contradiction)
- We want to prove that $n$ is not $O(10000 \cdot \log_2(n))$.
- 1. Assume the Opposite: Assume $n = O(10000 \cdot \log_2(n))$.
- 2. Apply the Definition: This assumption requires constants $c > 0$ and $n_0 \ge 1$ such that: $$ n \le c \cdot (10000 \cdot \log_2(n)) \text{ for all } n > n_0 $$
- 3. Analyze the Limit: Let $c' = 10000 \cdot c$. The inequality simplifies to $n \le c' \cdot \log_2(n)$. Dividing by $n$: $$ 1 \le c' \cdot \frac{\log_2(n)}{n} $$
- 4. Contradiction: As $n$ approaches infinity, $\frac{\log_2(n)}{n}$ approaches 0. $$ \lim_{n \to \infty} \left( c' \cdot \frac{\log_2(n)}{n} \right) = 0 $$
- Conclusion: Since the inequality $1 \le c' \cdot \frac{\log_2(n)}{n}$ would have to hold for all $n > n_0$, taking the limit of both sides yields the contradiction $1 \le 0$, which is impossible. The linear function $n$ eventually outgrows any scaled logarithmic function, so $n \notin O(\log_2(n))$. The order of complexity growth matters and cannot be reversed (the sketch below illustrates the vanishing ratio numerically).
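The limit argument can also be watched numerically (an illustration, not part of the proof): for any fixed $c'$, the ratio $c' \cdot \log_2(n) / n$ eventually falls below 1, so no choice of constants can satisfy the assumed inequality for all large $n$. The value $c = 5$ below is an arbitrary choice made for the demonstration.

```python
import math

# The assumed inequality needs c' * log2(n) / n >= 1 for all large n,
# but the ratio shrinks toward 0; once it drops below 1 the assumption fails.
c_prime = 10000 * 5  # c' = 10000 * c for an arbitrary choice of c = 5
for n in [10, 10**3, 10**6, 10**9, 10**12]:
    ratio = c_prime * math.log2(n) / n
    print(f"n = 10^{round(math.log10(n)):<2}: c' * log2(n) / n = {ratio:.6f}")
```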
Proof Summaries
| Proof | Result | Conclusion |
|---|---|---|
| $10000 \cdot \log_2(n)$ vs $n$ | $10000 \cdot \log_2(n) \in O(n)$ | Logarithmic is bounded by Linear. |
| $n$ vs $10000 \cdot \log_2(n)$ | $n \notin O(\log_2(n))$ | Linear is NOT bounded by Logarithmic. |